Conversation

@kmodexc kmodexc commented Sep 12, 2024

This MR adds evaluation of confidence miscalibration. Confidence calibration is a topic investigated by, e.g., Kato and Kato and others, and is gaining relevance. Since this just adds an additional metric to the evaluation script, the overhead is small for those not interested in it. Alternatively, I could also make it possible to disable the metric via a setting in the config file.
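For context, a common way to quantify confidence miscalibration is the Expected Calibration Error (ECE). The sketch below is only illustrative of that kind of metric and is not the code in this PR; the function name, binning scheme, and inputs are assumptions.

```python
# Hypothetical sketch of an Expected Calibration Error (ECE) style metric,
# shown only to illustrate what a miscalibration evaluation could compute.
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """Bin predictions by confidence and average the |accuracy - confidence| gap,
    weighted by the fraction of samples in each bin."""
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    bin_edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if not mask.any():
            continue
        bin_acc = correct[mask].mean()        # empirical accuracy in this bin
        bin_conf = confidences[mask].mean()   # mean predicted confidence in this bin
        ece += mask.mean() * abs(bin_acc - bin_conf)
    return ece

# Example: a well-calibrated model yields an ECE close to 0.
print(expected_calibration_error([0.9, 0.8, 0.6, 0.3], [1, 1, 1, 0]))
```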

@Petros626

@kmodexc can you provide a comparison of results?
